Erratum for: On Markov Chains, Attractors, and Neural Nets
Abstract
There are two examples given in section 3.3 of the paper (pages 349–352). The first of the two examples is a three-matrix product cycle in four states. One may label this three-matrix product cycle by assigning a letter to each matrix; let the cycle be ...ABC... . A stationary cycle of state probability distributions and two state paths are given that do correspond to this product cycle, but more properly, or more naturally, to the slightly rotated version of the same cycle, ...BCA... . Similarly, the second example is a four-matrix product cycle in four states that one may label ...DEFG... . The stationary cycle of distributions and the state paths given more properly belong to the slightly rotated version of the same cycle, ...EFGD... . This means, essentially, that the coin is flipped at the end of the cycle, just before the next cycle begins, instead of at the beginning of the cycle. This subtle change in interpretation does not detract from the main thrust of the presentation in this section. As was indicated in [4], all cycle transition epochs of stationary paths are regeneration points, and once stationarity is reached it does not really matter which matrix starts the cycle. But the stationary cycle of distributions and the state paths given should be properly associated with their corresponding transition-matrix cycle order. For the sake of consistency, the version of the Hajnal qualification given in [4] was used. In reality, this qualification can be relaxed a little: all that is required is that at least one matrix in the cycle be a regular scrambling matrix [17]. Thus, the matrices A and D can be first, middle, or last in their cycles. It does not make much difference, because the paths are shifted anyway when all rotated matrix products are considered.
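As a rough numerical illustration of the rotation relationship described above (this sketch is not part of the original paper), the following Python/NumPy fragment builds three hypothetical random row-stochastic 4x4 matrices A, B, and C and checks that if pi is the stationary distribution of the product ABC, then pi A is the stationary distribution of the rotated product BCA, so the stationary cycle of distributions is simply shifted by one matrix. The helper names random_stochastic and stationary are invented for this example.

    import numpy as np

    rng = np.random.default_rng(0)

    def random_stochastic(n=4):
        # A random row-stochastic matrix, standing in for one matrix of the cycle.
        M = rng.random((n, n))
        return M / M.sum(axis=1, keepdims=True)

    def stationary(P, iters=5000):
        # Stationary row vector of a row-stochastic matrix P, by power iteration.
        pi = np.full(P.shape[0], 1.0 / P.shape[0])
        for _ in range(iters):
            pi = pi @ P
        return pi

    A, B, C = random_stochastic(), random_stochastic(), random_stochastic()

    pi_ABC = stationary(A @ B @ C)   # stationary distribution of the cycle product ABC
    pi_BCA = stationary(B @ C @ A)   # stationary distribution of the rotated product BCA

    # If pi (ABC) = pi, then (pi A)(BCA) = pi (ABC) A = pi A,
    # so the stationary cycle of distributions is shifted by one matrix.
    print(np.allclose(pi_ABC @ A, pi_BCA))   # expected output: True

The same one-step shift applies, with the obvious changes, to the four-matrix cycle DEFG and its rotation EFGD.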
Similar Articles
On Markov Chains, Attractors, and Neural Nets
The work presented here relates specific aspects of the theory of Markov chains with some concepts that arise in the theory of complexity. This is done in order to support and help explain more recent hypotheses on how networks of neurons process information. The reader is referred to [1] where a good treatment of the subject of Markov chains is offered, and to [2] where a good introductory rev...
Empirical Bayes Estimation in Nonstationary Markov chains
Estimation procedures for nonstationary Markov chains appear to be relatively sparse. This work introduces empirical Bayes estimators for the transition probability matrix of a finite nonstationary Markov chain. The data are assumed to be of a panel study type in which each data set consists of a sequence of observations on N>=2 independent and identically dis...
Dependability Models Based on Petri Nets and Markov Chains
This paper shows a way to use stochastic Petri nets as formal availability models instead of Markov chains. Advantages of stochastic Petri nets over Markov models are illustrated on example models. The ability of stochastic Petri nets to represent the structure of modelled design and its use in further research is introduced.
Evaluation of First and Second Markov Chains Sensitivity and Specificity as Statistical Approach for Prediction of Sequences of Genes in Virus Double Strand DNA Genomes
Growing amount of information on biological sequences has made application of statistical approaches necessary for modeling and estimation of their functions. In this paper, sensitivity and specificity of the first and second Markov chains for prediction of genes was evaluated using the complete double stranded DNA virus. There were two approaches for prediction of each Markov Model parameter,...
Taylor Expansion for the Entropy Rate of Hidden Markov Chains
We study the entropy rate of a hidden Markov process, defined by observing the output of a symmetric channel whose input is a first order Markov process. Although this definition is very simple, obtaining the exact amount of entropy rate in calculation is an open problem. We introduce some probability matrices based on Markov chain's and channel's parameters. Then, we try to obtain an estimate ...
Journal: Complex Systems
Volume: 12, Issue: -
Pages: -
Publication year: 2000